14 research outputs found

    Investigating the build-up of precedence effect using reflection masking

    The auditory processing level involved in the build-up of precedence [Freyman et al., J. Acoust. Soc. Am. 90, 874–884 (1991)] was investigated here by employing reflection masked threshold (RMT) techniques. Because RMT techniques are generally assumed to tap lower levels of auditory signal processing, this represents a bottom-up approach to the build-up of precedence. Three conditioner configurations measuring a possible build-up of reflection suppression were compared to the baseline RMT for four reflection delays ranging from 2.5 to 15 ms. No build-up of reflection suppression was observed for any of the conditioner configurations. Build-up of template (a decrease in RMT for two of the conditioners), on the other hand, was found to be delay dependent. For five of six listeners, at reflection delays of 2.5 and 15 ms, RMT decreased relative to the baseline; at 5- and 10-ms delays, no change in threshold was observed. It is concluded that the low-level auditory processing involved in RMT is not sufficient to realize a build-up of reflection suppression, confirming suggestions that higher-level processing is involved in the build-up of the precedence effect. The observed enhancement of reflection detection (lower RMT) may contribute to active suppression at higher processing levels.

    Multisensory integration of redundant and complementary cues

    During multisensory integration, information from distinct sensory systems that refers to the same physical event is combined. For example, the sound and image that an individual generates while interacting with the world provide the nervous system with multiple cues, which can be integrated to estimate the individual's position in the environment. However, the information perceived through different sensory pathways can be qualitatively different. It can be redundant, describing the same property of an event in a common reference frame (e.g., the image and sound both referring to the individual's location), or it can be complementary. Combining complementary information can be advantageous in that it extends the range and richness of the information available to the nervous system, but it can also be superfluous to the task at hand: olfactory cues from the individual's perfume can enrich the representation without necessarily aiding localisation. Over the last century or so, a large body of research has focused on different aspects of multisensory interactions at both the behavioural and neural levels. It is currently unclear whether the mechanisms underlying multisensory interactions are similar for both types of cue. Moreover, the evidence for differences in behavioural outcome, dependent on the nature of the cue, is growing. Such cue-property effects possibly reflect a processing heuristic for more efficient parsing of the vast amount of sensory information available to the nervous system at any one time.
The present thesis assesses the effects of cue properties (i.e., redundant or complementary) on multisensory processing and reports a series of experiments demonstrating that the nature of the cue, defined by the observer's task, influences whether the cues compete for representation as a result of interacting, or whether instead multisensory information produces an optimal increase in the reliability of the event estimate. Moreover, a bridging series of experiments demonstrates the key role of redundancy in inferring that two signals have a common physical cause and should be integrated, despite conflict between the cues. The experiments provide insights into the different strategies adopted by the nervous system and some tentative evidence for possible distinct underlying mechanisms.
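The "optimal increase in reliability of the event estimate" mentioned above is conventionally formalized as inverse-variance-weighted (maximum-likelihood) cue combination. The sketch below illustrates that standard model; the numbers and variable names are hypothetical and are not taken from the thesis.

```python
# Hypothetical unimodal estimates of one event property (e.g. location, deg)
# and their variances; the values are illustrative only.
visual_est, visual_var = 10.0, 1.0
auditory_est, auditory_var = 14.0, 4.0

# Each cue is weighted by its relative reliability (inverse variance).
w_v = (1 / visual_var) / (1 / visual_var + 1 / auditory_var)
w_a = 1 - w_v

combined_est = w_v * visual_est + w_a * auditory_est
combined_var = 1 / (1 / visual_var + 1 / auditory_var)

print(combined_est)  # pulled toward the more reliable (visual) cue: 10.8
print(combined_var)  # smaller than either unimodal variance: 0.8
```

The key property of this model is that the combined variance is always lower than the variance of either cue alone, which is the "optimal increase in reliability" referred to in the abstract.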

    Temporal ventriloquism in a purely temporal context

    Hartcher O'Brien J, Alais DM. Temporal ventriloquism in a purely temporal context. Journal of Experimental Psychology: Human Perception and Performance. 2011;37(5):1383–1395.

    Outdoor workers and sun protection : knowledge and behaviour


    Extending visual dominance over touch for input off the body

    Hartcher O'Brien J, Levitan CA, Spence CJ. Extending visual dominance over touch for input off the body. Brain Research. 2011;1362:48–55.

    Nonlinear temporal distortions in vision and audition

    Hartcher O'Brien J, Telgen S, Di Luca M, Ernst MO. Nonlinear temporal distortions in vision and audition. Presented at the 11th International Multisensory Research Forum (IMRF 2010).

    The multisensory perception of synchrony

    Spence C, Navarra J, Vatakis A, Hartcher O'Brien J, Parise C. The multisensory perception of synchrony. Presented at the European Conference on Visual Perception (2009), Regensburg, Germany.

    Naturalistic Stimulus Structure Determines the Integration of Audiovisual Looming Signals in Binocular Rivalry

    Rapid integration of biologically relevant information is crucial for the survival of an organism. Most prominently, humans should be biased to attend and respond to looming stimuli that signal approaching danger (e.g., a predator) and hence require rapid action. This psychophysics study used binocular rivalry to investigate the perceptual advantage of looming (relative to receding) visual signals (i.e., the looming bias) and how this bias can be influenced by concurrent auditory looming/receding stimuli and the statistical structure of the auditory and visual signals.
    Subjects were dichoptically presented with looming/receding visual stimuli that were paired with looming or receding sounds. The visual signals conformed to two different statistical structures: (1) a "simple" random-dot kinematogram showing a starfield and (2) a "naturalistic" visual Shepard stimulus. Likewise, the looming/receding sound was (1) a simple amplitude- and frequency-modulated (AM-FM) tone or (2) a complex Shepard tone. Our results show that the perceptual looming bias (i.e., the increase in dominance times for looming versus receding percepts) is amplified by looming sounds, yet reduced and even converted into a receding bias by receding sounds. Moreover, the influence of looming/receding sounds on the visual looming bias depends on the statistical structure of both the visual and auditory signals: it is enhanced when the audiovisual signals are Shepard stimuli.
    In conclusion, visual perception prioritizes processing of biologically significant looming stimuli, especially when they are paired with looming auditory signals. Critically, these audiovisual interactions are amplified for statistically complex signals that are more naturalistic and known to engage neural processing at multiple levels of the cortical hierarchy.

    Sound waveforms and time-frequency representations (A–D).

    Sound waveforms (left) and time-frequency representations (right) of the amplitude- and frequency-modulated (AM-FM) tone (top: A, C) and the Shepard tone (bottom: B, D).
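As a minimal sketch of the kind of AM-FM tone the caption describes, the snippet below synthesizes a sinusoidal carrier whose amplitude and instantaneous frequency are both modulated. All parameter values (carrier frequency, modulation rate, depths) are hypothetical and are not the stimulus settings used in the study.

```python
import numpy as np

fs = 44100                # sample rate (Hz)
dur = 1.0                 # duration (s)
t = np.arange(int(fs * dur)) / fs

f0 = 440.0                # carrier frequency (Hz) -- illustrative value
f_mod = 2.0               # shared AM/FM modulation rate (Hz)
am_depth = 0.5            # amplitude-modulation depth
fm_dev = 50.0             # peak frequency deviation (Hz)

# Amplitude envelope (AM), and phase for sinusoidal frequency modulation (FM):
# instantaneous frequency f(t) = f0 + fm_dev * sin(2*pi*f_mod*t), so the
# integrated phase is 2*pi*f0*t - (fm_dev / f_mod) * cos(2*pi*f_mod*t).
envelope = 1.0 + am_depth * np.sin(2 * np.pi * f_mod * t)
phase = 2 * np.pi * f0 * t - (fm_dev / f_mod) * np.cos(2 * np.pi * f_mod * t)
tone = envelope * np.sin(phase)
```

The resulting `tone` array can be written to a sound file or plotted as a spectrogram to reproduce the waveform/time-frequency views shown in panels A and C.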